Convergence of a simple subgradient level method
Authors
Abstract
We study the subgradient projection method for convex optimization with Brännlund's level control for estimating the optimal value. We establish global convergence in objective values without the additional assumptions employed in the literature.
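To make the scheme concrete, here is a minimal Python sketch of a projected subgradient iteration with a level-based (Polyak-type) step size. The names `f`, `subgrad`, and `proj_X` stand for a user-supplied objective, subgradient oracle, and Euclidean projection onto the feasible set; the path-length test used to halve the gap delta is one common form of Brännlund-style level control rather than the paper's exact rule, and all parameter values are illustrative.

```python
import numpy as np

def subgradient_level_method(f, subgrad, proj_X, x0,
                             delta0=1.0, path_bound=10.0, max_iter=1000):
    """Projected subgradient method with a level-based step size.

    The level f_rec - delta serves as an estimate of the optimal value;
    delta is halved whenever the path length accumulated since the last
    halving exceeds path_bound (a Brannlund-style level control).
    """
    x = np.asarray(x0, dtype=float)
    f_rec = f(x)          # best objective value recorded so far
    delta = delta0        # gap used to form the target level
    path = 0.0            # path length accumulated since delta was last halved
    for _ in range(max_iter):
        g = np.asarray(subgrad(x), dtype=float)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:                 # zero subgradient: x is optimal
            break
        level = f_rec - delta
        t = (f(x) - level) / gnorm2       # Polyak-type step toward the level
        x_new = np.asarray(proj_X(x - t * g), dtype=float)
        path += float(np.linalg.norm(x_new - x))
        x = x_new
        f_rec = min(f_rec, f(x))
        if path > path_bound:             # level looks too optimistic: shrink the gap
            delta *= 0.5
            path = 0.0
    return x, f_rec
```

For instance, with `f = lambda x: np.abs(x).sum()`, `subgrad = np.sign`, and `proj_X` the identity map, the recorded objective value decreases toward the optimal value 0.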
Similar resources
Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions—a wide class of functions which includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochast...
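As a rough illustration of the two-level structure described above (not the paper's exact algorithm), the sketch below runs an inexact proximal point outer loop in which each strongly convex subproblem min_x f(x) + (rho/2)||x - center||^2 is solved approximately by projected stochastic subgradient steps. `stoch_subgrad` and `proj_X` are assumed user-supplied oracles, and the step-size schedule and iteration counts are illustrative.

```python
import numpy as np

def proximally_guided_subgradient(stoch_subgrad, proj_X, x0, rho=1.0,
                                  outer_iters=50, inner_iters=200, lr0=0.1):
    """Inexact proximal point outer loop for a weakly convex objective.

    Each strongly convex proximal subproblem
        min_x  f(x) + (rho/2) * ||x - center||^2
    is solved approximately by projected stochastic subgradient steps,
    and the averaged inner iterate becomes the next proximal center.
    """
    center = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        x = center.copy()
        x_sum = np.zeros_like(center)
        for j in range(inner_iters):
            # stochastic subgradient of the regularized surrogate
            g = np.asarray(stoch_subgrad(x), dtype=float) + rho * (x - center)
            step = lr0 / (1.0 + j)                  # diminishing step size
            x = np.asarray(proj_X(x - step * g), dtype=float)
            x_sum += x
        center = x_sum / inner_iters                # average becomes the new center
    return center
```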
A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations
In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent direction...
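For context, the absolute value equation referred to above is Ax - |x| = b. The snippet below shows the classical generalized Newton iteration for the AVE, a simpler baseline technique and not the Levenberg-Marquardt / conjugate-subgradient scheme of that paper; it assumes A is square and each iteration matrix is nonsingular.

```python
import numpy as np

def ave_generalized_newton(A, b, x0=None, max_iter=50, tol=1e-10):
    """Classical generalized Newton iteration for the AVE  A x - |x| = b:
        x_{k+1} = (A - diag(sign(x_k)))^{-1} b
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        D = np.diag(np.sign(x))
        x_new = np.linalg.solve(A - D, b)           # one generalized Newton step
        if np.linalg.norm(A @ x_new - np.abs(x_new) - b) < tol:
            return x_new                            # residual small enough
        x = x_new
    return x
```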
Convergence of the Surrogate Lagrangian Relaxation Method
Mikhail A. Bragin • Peter B. Luh • Joseph H. Yan • Nanpeng Yu • Gary A. Stern. Communicated by Fabián Flores-Bazán. Abstract: Studies have shown that the surrogate subgradient method, to optimize non-smooth dual functions within the Lagrangian relaxation framework, can lead to significant computational improvements as compared to the subgradient method. The key idea is to obtain surrogate subgradi...
Convergence Rates of Min-Cost Subgraph Algorithms for Multicast in Coded Networks
The problem of establishing minimum-cost multicast connections in coded networks can be viewed as an optimization problem, and decentralized algorithms were proposed by Lun et al. to compute the optimal subgraph using the subgradient method on the dual problem. However, the convergence rate problem for these algorithms remains open. There are limited results in the literature on the convergence...
Radial Subgradient Descent
We present a subgradient method for minimizing non-smooth, non-Lipschitz convex optimization problems. The only structure assumed is that a strictly feasible point is known. We extend the work of Renegar [1] by taking a different perspective, leading to an algorithm which is conceptually more natural, has notably improved convergence rates, and for which the analysis is surprisingly simple. At ...
Journal: Math. Program.
Volume: 85
Issue: -
Pages: -
Publication date: 1999